
Dev/v0.2 - Enable streaming support for openai v1.0.0b3 #491

Closed
wants to merge 3 commits

Conversation

Alvaromah (Collaborator)

Why are these changes needed?

The OpenAI API, like other LLM frameworks, offers streaming, which improves debugging workflows: output appears as it is generated instead of only after the complete response arrives, saving time.

This is a simple mechanism to support streaming.
Tested on openai v1.0.0b3.

To enable streaming, use the following configuration:

llm_config={
    "config_list": config_list,
    # Enable or disable streaming (defaults to False)
    "stream": True,
}

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
)
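As a rough illustration of what the streaming path has to do downstream (this helper is not part of the PR), the client must join the content deltas of successive chunks to reconstruct the full reply. The dicts below are simplified stand-ins for openai v1 ChatCompletionChunk deltas:

```python
# Hedged sketch, not from this PR: joining streamed content deltas.
def accumulate_stream(chunks):
    """Concatenate non-empty content deltas into one message."""
    parts = []
    for chunk in chunks:
        delta = chunk.get("content")
        if delta:  # final chunks often carry content=None
            parts.append(delta)
    return "".join(parts)

fake_stream = [{"content": "Hel"}, {"content": "lo"}, {"content": None}]
print(accumulate_stream(fake_stream))  # prints Hello
```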

Related issue number

Related to #465, #217

Checks

@sonichi (Contributor) left a comment

Let me find a few reviewers who are interested in streaming support. Feel free to invite if you know any.
Meanwhile, the structure looks good to me. It's a good time to add a test.

else:
    # If streaming is not enabled, send a regular chat completion request
    # Ensure streaming is disabled
    params["stream"] = False

Is this needed? I'd like to avoid modifying the original dict if not necessary.

autogen/oai/client.py (review thread resolved)
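One way to address the reviewer's concern about mutating the caller's dict is to read the flag with a default and merge into a fresh dict instead of writing the key back. This is a hedged sketch of that suggestion; the name `build_request` is illustrative, not from the PR:

```python
# Hedged sketch of the non-mutating alternative the reviewer hints at.
def build_request(params):
    # Merge into a fresh dict so the caller's params are never mutated.
    return {**params, "stream": params.get("stream", False)}

caller_params = {"model": "gpt-4"}
request = build_request(caller_params)
print("stream" in caller_params)  # prints False: original dict untouched
print(request["stream"])          # prints False
```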
@ragyabraham (Collaborator)

This works great for me. We've also managed to integrate it with #394 (see gif below).

[GIF: streaming with sockets]

Given that streaming is essentially a UX-centric feature, it would probably make sense to open another PR to formally integrate streaming with some sort of messaging framework (e.g. sockets).
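The socket integration mentioned above could look roughly like the following sketch (an assumption, not code from this PR or #394): each streamed token is pushed onto a channel as it arrives, with an in-memory queue standing in for a real socket connection.

```python
# Hedged sketch: forwarding streamed tokens to a message channel.
import queue

def forward_stream(tokens, channel):
    for token in tokens:
        channel.put(token)   # a real integration would emit over the socket here
    channel.put(None)        # sentinel marking end of stream

channel = queue.Queue()
forward_stream(["Hel", "lo"], channel)

received = []
while (token := channel.get()) is not None:
    received.append(token)
print("".join(received))  # prints Hello
```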

@ragyabraham (Collaborator) left a comment

LGTM

@pq-dong

pq-dong commented Nov 2, 2023

Are there any plans to merge this into microsoft:dev/v0.2? If so, when?

@sonichi sonichi requested review from pcdeadeasy and a team November 2, 2023 03:42
@sonichi (Contributor)

sonichi commented Nov 2, 2023

Are there any plans to merge this into microsoft:dev/v0.2? If so, when?

After the conversation is resolved, and no further revision request from other reviewers.

@franciscoabenza

franciscoabenza commented Nov 2, 2023

@ragyabraham 🙏 Please, can you explain how you are running the UI? Is it part of your autogen branch?


@ragyabraham (Collaborator)

Hey @franciscoabenza, yeah, this is a web UI we are building for autogen. You can find it here

@sonichi sonichi deleted the branch microsoft:dev/v0.2 November 4, 2023 04:33
@sonichi sonichi closed this Nov 4, 2023
@sonichi (Contributor)

sonichi commented Nov 5, 2023

dev/v0.2 is merged into main. Could you recreate the PR to target main? @Alvaromah

@Alvaromah (Collaborator, Author)

dev/v0.2 is merged into main. Could you recreate the PR to target main? @Alvaromah

Sure! I will as soon as possible.

@Alvaromah (Collaborator, Author)

dev/v0.2 is merged into main. Could you recreate the PR to target main? @Alvaromah

Created PR #597 to target main.
